PII: S0893-6080(97)00032-4
Author
Abstract
An algorithm for addition and deletion (ADDEL) of resources during learning is developed to achieve two goals: (1) to find feed-forward multilayer networks that are as small as possible, and (2) to find an appropriate structure for such small networks. These goals are accomplished by operating alternately between an adding phase and a deleting phase while learning the given input-output associations. The adding phase develops a crude structure by filling in resources (connections, units and layers) in a virtual multilayer network with a maximum of L possible layers. The deleting phase then removes any unnecessary connections to obtain a refined structure. The additions and deletions are done based on a sensitivity measure and a corresponding probability rule, so that only the synapses which are most effective in reducing the output error are preserved. A generalization error estimated from a validation set is used to control the alternation between the two learning phases and the termination of learning. Simulations, including handwritten digit recognition, demonstrate that the algorithm is effective in finding an appropriate network structure for a small network which can generalize well. The algorithm is also used to investigate when the size of a network is important for generalization. © 1997 Elsevier Science Ltd.
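As a rough illustration of how such an add/delete cycle can be organized, the sketch below alternates an adding phase and a deleting phase on a masked single-hidden-layer network, using an assumed sensitivity measure (|dE/dw| for inactive synapses, |w · dE/dw| for active ones), an assumed probability rule, and validation error to stop. It is a minimal sketch of the general idea only; the paper's ADDEL procedure, with its virtual L-layer structure and its exact sensitivity and probability rules, is not reproduced here.

```python
# Minimal sketch of an add/delete cycle on a masked one-hidden-layer network.
# The sensitivity measures, probability rules and schedule are illustrative
# assumptions; the paper's ADDEL algorithm also grows units and layers inside
# a virtual L-layer network, which is not reproduced here.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data, split into a training set and a validation set.
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sin(3.0 * X[:, :1]) * X[:, 1:]
Xtr, ytr, Xva, yva = X[:150], y[:150], X[150:], y[150:]

H = 20                                           # virtual (maximum) hidden size
W1 = rng.normal(0.0, 0.3, (2, H)); M1 = np.zeros_like(W1)   # weights, masks
W2 = rng.normal(0.0, 0.3, (H, 1)); M2 = np.zeros_like(W2)
M1[:, :3] = 1.0; M2[:3, :] = 1.0                 # small seed network

def forward(Xb):
    h = np.tanh(Xb @ (W1 * M1))
    return h, h @ (W2 * M2)

def grads(Xb, yb):
    """Gradients of the MSE w.r.t. the *effective* weight matrices."""
    h, out = forward(Xb)
    d_out = 2.0 * (out - yb) / len(Xb)
    g2 = h.T @ d_out
    d_h = (d_out @ (W2 * M2).T) * (1.0 - h ** 2)
    g1 = Xb.T @ d_h
    return g1, g2

def train(steps=300, lr=0.1):
    global W1, W2
    for _ in range(steps):
        g1, g2 = grads(Xtr, ytr)
        W1 -= lr * g1 * M1                       # only active synapses learn
        W2 -= lr * g2 * M2

def val_error():
    return float(np.mean((forward(Xva)[1] - yva) ** 2))

best = np.inf
for phase in range(10):
    # Adding phase: unmask inactive synapses with probability increasing in
    # their sensitivity |dE/dw| at zero effective weight (assumed rule).
    g1, g2 = grads(Xtr, ytr)
    for M, g in ((M1, g1), (M2, g2)):
        s = np.abs(g) * (M == 0)
        p = 0.05 + 0.25 * s / (s.max() + 1e-12)
        M[(M == 0) & (rng.random(M.shape) < p)] = 1.0
    train()

    # Deleting phase: remove active synapses whose sensitivity |w * dE/dw|
    # is small, again through a probabilistic rule.
    g1, g2 = grads(Xtr, ytr)
    for W, M, g in ((W1, M1, g1), (W2, M2, g2)):
        s = np.abs(W * g) * (M == 1)
        p = 0.3 * (1.0 - s / (s.max() + 1e-12))
        M[(M == 1) & (rng.random(M.shape) < p)] = 0.0
    train()

    err = val_error()
    print(f"phase {phase}: val MSE {err:.4f}, "
          f"active synapses {int(M1.sum() + M2.sum())}")
    if err > best:                               # rising validation error
        break                                    # terminates learning
    best = err
```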
Similar resources
Improving support vector machine classifiers by modifying kernel functions
We propose a method of modifying a kernel function to improve the performance of a support vector machine classifier. This is based on the structure of the Riemannian geometry induced by the kernel function. The idea is to enlarge the spatial resolution around the separating boundary surface, by a conformal mapping, such that the separability between classes is increased. Examples are given spe...
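As a rough sketch of this idea, the code below conformally rescales an RBF kernel, K̃(x, x') = c(x)c(x')K(x, x'), with a factor c(x) that is large near the support vectors of a first-pass SVM (a stand-in for the separating boundary), and retrains on the modified kernel. The particular form of c(x), the bandwidth tau, and the two-stage procedure are assumptions for illustration, not necessarily the authors' exact construction.

```python
# Sketch: conformally modifying an RBF kernel so that spatial resolution is
# enlarged near the decision boundary (approximated here by the first-pass
# support vectors).  The form of c(x) and the two-stage procedure are
# illustrative assumptions.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.25, random_state=0)
Xtr, ytr, Xte, yte = X[:300], y[:300], X[300:], y[300:]

gamma, tau = 1.0, 0.5

# Stage 1: ordinary RBF-kernel SVM; its support vectors lie near the boundary.
svm1 = SVC(kernel="rbf", gamma=gamma).fit(Xtr, ytr)
sv = svm1.support_vectors_

def c(Z):
    """Conformal factor: large close to the support vectors (the boundary)."""
    d2 = ((Z[:, None, :] - sv[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * tau ** 2)).sum(axis=1)

def modified_gram(A, B):
    return c(A)[:, None] * c(B)[None, :] * rbf_kernel(A, B, gamma=gamma)

# Stage 2: retrain on the conformally modified kernel.
svm2 = SVC(kernel="precomputed").fit(modified_gram(Xtr, Xtr), ytr)

print("plain RBF accuracy:   ", svm1.score(Xte, yte))
print("modified kernel acc.: ", svm2.score(modified_gram(Xte, Xtr), yte))
```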
Precision Requirements for Closed-Loop Kinematic Robotic Control Using Linear Local Mappings
Neural networks are approximation techniques that can be characterized by adaptability rather than by precision. For feedback systems, high precision can still be acquired in the presence of errors. Within a general iterative framework of closed-loop kinematic robotic control using linear local modeling, the inverse Jacobian matrix error and the maximum length of the displacement for which the line...
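A toy version of such a feedback loop, with a deliberately perturbed inverse Jacobian standing in for an imprecise learned local linear mapping, might look like this (illustrative only; not the paper's precision analysis):

```python
# Toy illustration: closed-loop inverse kinematics for a planar 2-link arm.
# A perturbed inverse Jacobian stands in for an imprecise learned local linear
# mapping; the feedback loop still drives the end-effector error down as long
# as the perturbation and the per-cycle displacement stay modest.
import numpy as np

L1, L2 = 1.0, 0.8

def fk(q):
    """Forward kinematics of the 2-link arm."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jacobian(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

rng = np.random.default_rng(1)
q = np.array([0.3, 0.8])
target = np.array([1.0, 0.9])
step = 0.5                                   # bounded displacement per cycle

for it in range(30):
    err = target - fk(q)
    if np.linalg.norm(err) < 1e-4:
        break
    J_inv = np.linalg.inv(jacobian(q))
    J_inv *= 1.0 + 0.2 * rng.standard_normal(J_inv.shape)   # ~20% model error
    q = q + step * J_inv @ err               # closed-loop correction

print(f"iterations: {it}, residual error: {np.linalg.norm(target - fk(q)):.2e}")
```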
PII: S0893-6080(97)00012-9
Kohonen's learning vector quantization (LVQ) is modified by attributing training counters to each neuron, which record its training statistics. During training, this allows for dynamic self-allocation of the neurons to classes. In the classification stage, training counters provide an estimate of the reliability of classification of the single neurons, which can be exploited to obtain a substantially higher purity of classifica...
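A minimal sketch of LVQ1 augmented with per-neuron training counters, used here both for dynamic self-allocation of neurons to classes and for a reliability estimate at classification time, could look as follows; the counter bookkeeping and the rejection threshold are assumptions, not the paper's exact scheme.

```python
# Sketch of LVQ1 with per-neuron training counters.  Counters record how often
# each neuron wins for each class; they re-allocate neurons to their dominant
# class during training and estimate a neuron's reliability afterwards.
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class Gaussian data.
X = np.vstack([rng.normal(-1, 1, (200, 2)), rng.normal(+1, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

n_proto, n_classes, lr = 10, 2, 0.05
protos = X[rng.choice(len(X), n_proto, replace=False)].copy()
labels = rng.integers(0, n_classes, n_proto)      # initial (arbitrary) labels
counters = np.zeros((n_proto, n_classes))         # training counters

for epoch in range(20):
    for i in rng.permutation(len(X)):
        w = np.argmin(((protos - X[i]) ** 2).sum(axis=1))    # winning neuron
        counters[w, y[i]] += 1
        sign = +1.0 if labels[w] == y[i] else -1.0           # LVQ1 update
        protos[w] += sign * lr * (X[i] - protos[w])
    labels = counters.argmax(axis=1)              # dynamic self-allocation

# Reliability of each neuron: fraction of its wins that match its class.
reliability = counters.max(axis=1) / np.maximum(counters.sum(axis=1), 1)

def classify(x, min_reliability=0.8):
    """Return (class, accepted); reject if the winning neuron is unreliable."""
    w = np.argmin(((protos - x) ** 2).sum(axis=1))
    return labels[w], reliability[w] >= min_reliability

print("neuron reliabilities:", np.round(reliability, 2))
print("example decision:", classify(np.array([1.2, 0.8])))
```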
The connection between regularization operators and support vector kernels
In this paper a correspondence is derived between regularization operators used in regularization networks and support vector kernels. We prove that the Green's Functions associated with regularization operators are suitable support vector kernels with equivalent regularization properties. Moreover, the paper provides an analysis of currently used support vector kernels in the view of regulariz...
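In symbols (with notation assumed here for illustration), the correspondence links the regularized risk with operator P to a kernel k that is the Green's function of P*P, so that the minimizer has a support vector expansion:

```latex
% Regularized risk with regularization operator P, the Green's-function
% condition that makes k an admissible support vector kernel, and the
% resulting kernel expansion of the minimizer (notation assumed).
\[
  R_{\mathrm{reg}}[f] \;=\; \sum_{i=1}^{\ell} \bigl(y_i - f(x_i)\bigr)^2
                       \;+\; \lambda\,\lVert P f \rVert^{2},
  \qquad
  \bigl(P^{*}P\,k\bigr)(x, x') \;=\; \delta(x - x'),
\]
\[
  f(x) \;=\; \sum_{i=1}^{\ell} \alpha_i\, k(x_i, x).
\]
```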
Regularization with a Pruning Prior
We investigate the use of a regularization prior and its pruning properties. We illustrate the behavior of this prior by conducting analyses both using a Bayesian framework and with the generalization method, on a simple toy problem. Results are thoroughly compared with those obtained with a traditional weight decay. Copyright 1997 Elsevier Science Ltd.
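For contrast, traditional weight decay can be set against a well-known pruning-type regularizer (the weight-elimination penalty of Weigend et al.), shown here only as an illustration of the pruning effect, not as the paper's prior:

```latex
% Traditional weight decay versus a pruning-type penalty (weight elimination,
% shown only for contrast; not claimed to be the paper's prior).
\[
  \Omega_{\mathrm{wd}}(\mathbf{w}) \;=\; \frac{\lambda}{2}\sum_j w_j^{2},
  \qquad
  \Omega_{\mathrm{prune}}(\mathbf{w}) \;=\; \lambda \sum_j
      \frac{w_j^{2}}{w_0^{2} + w_j^{2}}.
\]
% The pruning penalty saturates for |w_j| >> w_0, so large weights are hardly
% shrunk while small weights are pushed toward zero and can be removed.
```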
Journal title:
Volume, issue:
Pages: -
Publication date: 1997